METHOD AND SYSTEM FOR DETERMINING A THRESHOLD FOR PIXEL CLASSIFICATION
Patent Abstract:
The present invention relates to a system and method for determining a pixel classification threshold for vehicle occupancy detection. An IR image of a moving vehicle is captured using a multi-band IR imaging system. The driver's face is detected using a face recognition algorithm. The multi-band spectral information extracted from pixels identified as human tissue on the driver's face is used to determine a pixel classification threshold. That threshold is then used to facilitate pixel classification of the rest of the IR image. Once the pixels in the rest of the image have been classified, a determination can be made whether the vehicle contains human occupants in addition to the driver. An authority is alerted in the event that the vehicle is observed traveling on an HOV/HOT lane that requires two human occupants and the determination has been made that the vehicle contains an insufficient number of human occupants.

Publication number: BR102012031758B1
Application number: R102012031758-3
Filing date: 2012-12-13
Publication date: 2021-06-01
Inventors: Yao Rong Wang; Beilei Xu; Peter Paul
Applicant: Xerox Corporation
IPC main class:
Patent Description:
Manual enforcement of HOV/HOT lanes by law enforcement can be difficult and potentially dangerous. Pulling over drivers to issue citations for violations tends to disrupt traffic and can become a safety hazard, not only for the officer but also for the vehicle occupants. Consequently, automated occupancy detection (i.e., the ability to automatically detect the human occupants of vehicles), preferably coupled with automated vehicle recognition and ticketing, is desirable. Further development is needed on automated solutions for determining the number of human occupants in a motor vehicle.

While ordinary visible light can be used for vehicle occupancy detection through the front windshield under ideal conditions, there are drawbacks. For example, cabin penetration with visible light can easily be compromised by factors such as tinted windshields as well as environmental conditions such as rain, snow, dirt, and the like. Additionally, visible illumination at night can distract drivers. Near-infrared illumination has advantages over visible light illumination, including being unobservable by drivers. What is needed are methods for analyzing a captured IR image of a moving motor vehicle and processing that image to determine the total number of human occupants in the vehicle.

Vehicle occupancy detection methods often rely on prior knowledge of the skin's reflectance spectrum to identify a human occupant in an IR image. Although such occupancy detection methods, which use spectral information from the skin in a multi-band camera system, can provide accurate occupancy detection, they require readjusting the comparison threshold value whenever environmental conditions change.

Brief Description of the Drawings

Figure 1 illustrates an embodiment of an example IR illumination system 100;
Figure 2 illustrates an embodiment of an example IR detection system 200;
Figure 3 shows an example vehicle occupancy detection system incorporating the IR illumination system of Figure 1 and the IR detection system of Figure 2;
Figure 4 shows a front windshield area cropped from an image of the motor vehicle of Figure 3, captured using the IR illuminator and IR detector of Figures 1 and 2, respectively;
Figure 5 is a flow diagram illustrating an example embodiment of the present method for determining a threshold for pixel classification in the vehicle occupancy detection system of Figure 3;
Figure 6 is a continuation of the flow diagram of Figure 5, with flow processing continuing with respect to nodes A and B;
Figure 7 is a block diagram of an example system capable of implementing various aspects of the present method shown and described in relation to the flow diagrams of Figures 5 and 6.

Detailed Description

A pixel classification system and method is described that uses multi-band spectral information from pixels known to belong to the driver's human tissue to obtain a threshold value which, in turn, is used to classify pixels in the remaining portion of the image so that the number of human occupants in the vehicle can be determined.

It should be noted that in many countries automobiles are designed so that the driver sits on the right side and the passenger sits on the left side of the front passenger compartment, from the point of view of someone standing in front of the vehicle and looking at the front windshield.
In other countries, automobiles are designed so that the driver is on the left side and the passenger is on the right side of the front passenger compartment, from the same point of view. As such, any discussion herein with reference to the left and right sides of the passenger compartment is intended to cover both configurations and should not be viewed as limiting in any way.

An "image of a motor vehicle" means either a still image or video images of a motor vehicle obtained using an IR imaging system. A single frame of a fully populated IR image consists of an array of pixels, each having a respective intensity value measured in a desired spectral wavelength band of interest.

A "multi-band IR imaging system" is an apparatus comprising an IR illuminator and an IR detector designed to capture IR light reflected from a target object, separate it into its component wavelengths, and output an IR image of the target. The IR image is captured over multiple wavelength bands of interest. An IR imaging system can consist of a single IR detection device and a sequentially illuminated N-band illuminator (N ≥ 3) with a fixed filter, or comprise a total of N detectors (N ≥ 3), each with a corresponding bandpass filter, and a single illuminator. An example IR illumination system is shown in Figure 1. An example IR detection system is shown in Figure 2.

A "pixel classification threshold" is a value that separates pixels that can be categorized as human tissue from pixels categorized as other materials such as, for example, seat fabric, leather, metals, plastics, and the like.

A "correlation method" refers to a pixel classification method in which the pixels of an IR image are classified as human tissue based on an amount of correlation between the captured intensity of a pixel and the (scaled) intensity from a model comprising:

    I_c(i) = α ∫[λ1..λ2] I_s(λ) R_o(λ) T_G(λ) F_i(λ) D(λ) dλ,   i = 1, 2, ..., N,    (1)

where i indexes the i-th IR band captured using the i-th filter F_i(λ), I_s(λ) is the power spectrum of the IR illuminator, R_o(λ) is the reflectance of the object inside the vehicle, T_G(λ) is the transmission coefficient of the vehicle's glass window, D(λ) is the responsivity of the IR detector, λ1 and λ2 specify the wavelength range over which the camera detector integrates light, and the constant α depends on the angle and distance of the IR illuminator, the pixel size, and the camera integration time.

The correlation method uses a spectral materials database containing pre-measured reflectances of known materials such as human skin and hair, and other materials of interest. The database also includes the windshield glass transmission, the power spectrum of the IR illuminator(s), the filter transmission coefficients, and the responsivity curve of the IR detector(s). A theoretical pixel intensity for each object in the image can be calculated, and the measured intensity of each pixel can be compared to the theoretical intensities to determine an amount of correlation between them. In one embodiment, the correlation coefficient (for each pixel) is given by:

    C = Σ_i I_cm(i) I_cs(i) / ( √(Σ_i I_cm(i)²) √(Σ_i I_cs(i)²) ),   i = 1, ..., N,    (2)

where I_cm(i) is the captured intensity of a pixel in the i-th wavelength band, I_cs(i) is the intensity of a pixel of the driver's human tissue, and N is the total number of wavelength bands of the multi-band IR imaging system. If the driver's facial pixel intensity (with a particular reflectance) matches the measured pixel intensity of the object, the correlation will be high (close to 1); otherwise the correlation will be low (close to 0 or negative).
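For illustration only, and not as part of the original patent text, the following Python/NumPy sketch computes a per-pixel correlation coefficient of the form reconstructed in Equation (2) above; the function and array names, and the three-band example values, are assumptions made for the example.

```python
import numpy as np

def correlation_coefficient(pixel_intensity: np.ndarray,
                            tissue_intensity: np.ndarray) -> float:
    """Normalized correlation between a pixel's captured N-band intensities
    (I_cm) and a reference N-band intensity vector taken from the driver's
    facial tissue (I_cs). Values near 1 suggest human tissue; values near 0
    or below suggest other materials."""
    denom = np.linalg.norm(pixel_intensity) * np.linalg.norm(tissue_intensity)
    if denom == 0.0:
        return 0.0
    return float(np.dot(pixel_intensity, tissue_intensity) / denom)

# Illustrative three-band example (intensity values are made up).
c = correlation_coefficient(np.array([0.42, 0.38, 0.55]),
                            np.array([0.40, 0.36, 0.57]))
```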
Pixels are classified based on comparison with a threshold given by:

    C_th = C̄ - β √V_c,

where C̄ is the average of the correlation coefficient values of the driver's facial pixels, V_c is the variance of those correlation values, and β is a predetermined constant that balances the percentage of human tissue pixels to be included before too many non-tissue pixels are included as well.

A "ratio method" is a pixel classification method that classifies a pixel as human tissue, as opposed to other materials, if a band-intensity ratio (Equation 5) is greater or less than a predetermined threshold value. In one embodiment, the ratio is formed from the captured pixel intensities of two bands i and j, where i and j are any two different indices of the N bands (see Equation 1). Pixels are classified based on comparison with a pixel classification threshold given by:

    S_th = S̄ - γ √V_s,

where S̄ is the average of the intensity-ratio values of the driver's facial pixels, V_s is the variance of those ratio values, and γ is a predetermined constant that balances the percentage of human tissue pixels to be included before too many non-tissue pixels are included as well.

Example IR Illuminator

Reference is now made to Figure 1, which illustrates an embodiment of an example IR illumination system 100 for use in accordance with the teachings herein. The IR illumination system of Figure 1 comprises a plurality of IR light sources 102, each emitting a narrow band of IR radiation at a respective peak wavelength (λ1, ..., λn). Light source array 102 is an array of light-emitting diodes (LEDs). Each diode is preselected to emit radiation in a particular wavelength band of interest and defines a source in the array for that band. Controller 108 is coupled to source array 102 and controls the input current to each illuminator and, thereby, the output intensity of each. Grating 104, together with optics 103, combines the wavelengths to produce IR illumination beam 106. Sensor 110 samples the radiation emitted from the array of IR light sources and provides feedback to controller 108. Focusing optics 112 receive beam 106 and focus emission beam 114 onto vehicle 116. Optics 112 include a plurality of lenses of varying focal lengths positioned in the path of the beam to focus the beam as desired. Controller 108 is also coupled to optics 112 to effect changes in emission beam 114 due to the size, distance, and speed of the target, to name a few constraints. Controller 108 is in communication with storage device 109 to store and retrieve calibration information, intensity levels, and the like, including machine-readable data and program instructions. Controller 108 may comprise a computer system such as a desktop, laptop, server, or mainframe, or a special purpose computer system such as an ASIC. Controller 108 may be placed in wired or wireless communication with a computing workstation over a network, which may be a local area network (LAN) or the Internet. It should be noted that any of the components of IR illumination system 100 can be placed in communication with such a computing system to facilitate the objectives intended herein.

Any of the optical elements described in relation to the IR illumination system 100 of Figure 1 can be replaced by an optical system that has optical power and may additionally include mirrors. Such an optical system may include multiple components, each having optical power; for example, it may be a doublet or a triplet lens.
To the extent that such an optical system defines a single focal length F, the source array and the grating would be positioned in the front and rear focal planes of the optics. As a result, the optical system forms an image of the grating at infinity with respect to each element of the light source array, and thus each source element sees the same region of the grating. The light from each element would be coextensive over that region. The grating can then produce output radiation whose spectral content is substantially uniform across its transverse profile, by compensating for the dispersion associated with the lateral position of the different wavelength band sources. This, in turn, allows the spectral content of emission beam 114 to be substantially uniform across its transverse profile. In practice, it can be difficult to precisely define a desired focal length for the optical system because of aberrations (e.g., field curvature, axial chromatic, lateral chromatic, distortion, coma, and the like) that can cause the optics to focus rays to slightly different positions according to their wavelengths or their lateral positions. Accordingly, the relative positions of the optical system, the source array, and the grating are selected according to the more general condition that the optics forms an image of the grating at infinity with respect to each source element of the light source array, at least for paraxial rays emerging from each source. For a ray propagating at an angle θ to the optical axis, a paraxial ray satisfies sin(θ) ≈ θ. This infinity condition can be achieved by positioning each source element within the depth of field of a nominal rear focal plane of the optics, and placing the grating within the depth of field of a nominal front focal plane of the optics. The depth of field (DOV) is related to the numerical aperture (NA) of the optical system according to DOV = λ/NA², where λ is the wavelength of light from the source element. The optics can be designed with components that provide multiple degrees of freedom to compensate for various optical aberrations. While additional components in the optics provide additional degrees of freedom to reduce aberrations, each additional component also adds cost and complexity to the optics.

Example IR Detector

Reference is now made to Figure 2, which illustrates an embodiment of an example IR detection system 200 for use in accordance with the teachings herein. Target vehicle 116 reflects the IR emission beam 114 emitted by focusing optics 112 of the IR illumination system of Figure 1. A portion of the reflected IR light is received by optics 202, which has lenses 203 that focus the received light onto sensor 204, which spatially resolves the received light to obtain IR image 208. Optics 202 may also include one or more bandpass filters that allow only light in a narrow wavelength band to pass through the filter. The filters can also be changed sequentially to obtain N intensities at 208. Sensor 204 sends the IR image information to computer 206 for processing and storage. Detector 204 is a multispectral image detection device whose spectral content is selectable through a controller (not shown). Detector 204 records light intensity at multiple pixel locations along a two-dimensional grid. Optics 202 and detector 204 include components commonly found in commerce.
Suitable sensors include charge-coupled device (CCD) detectors, complementary metal oxide semiconductor (CMOS) detectors, charge injection device (CID) detectors, vidicon detectors, reticon detectors, image intensifier tube detectors, pixelated photomultiplier tube (PMT) detectors, InGaAs (indium gallium arsenide), mercury cadmium telluride (MCT), and microbolometer detectors. Computer 206 is in communication with optics 202 to control its lenses and with detector 204 to control its sensitivity. Computer 206 receives the sensitivity values associated with each pixel of IR image 208. Computer 206 includes a keyboard, monitor, printer, and the like (not shown) as needed to effect control of the various elements of IR detection system 200.

Reference is now made to Figure 3, which shows an example vehicle occupancy detection system incorporating the IR illumination system of Figure 1 and the IR detection system of Figure 2, in which various aspects of the teachings herein find their intended use. In Figure 3, vehicle 300 contains driver 302. Vehicle 300 is traveling at speed v in the direction of motion indicated by vector 303 along HOV/HOT lane 304, and is intended to be processed for vehicle occupancy detection. Positioned at a distance d above lane 304 is support arm 305 which, in various embodiments, is of a tubular construction similar to that used to support traffic lights and street lights. Attached to support arm 305 are IR detection system 307 and IR illumination system 306. Systems 306 and 307 are intended to represent the embodiments discussed with respect to Figures 1 and 2, respectively. IR illuminator 306 emits IR light at the desired wavelengths of interest. IR detector 307 is shown with a Tx/Rx element 308 for communicating the captured IR images and/or pixel intensity values to a remote device, such as the system of Figure 7, for processing in accordance with the teachings herein. The system of Figure 3 is preferably configured so that the field of view (FOV) of the IR detector covers a single lane. In operation, a vehicle enters the detection zone and the IR camera is activated by a trigger mechanism such as, for example, an in-ground sensor, a laser beam, and the like. Images of the motor vehicle are captured for processing. It should be noted that the various elements of Figure 3 are illustrative and are shown for purposes of explanation.

Figure 4 illustrates an example front passenger compartment 400 of a motor vehicle, viewed from the front of the vehicle through front windshield 402. Front passenger seat 405 is unoccupied while driver 407 is in driver's seat 406. Windshield area 402 was cropped from the IR image of motor vehicle 300 captured by the IR imaging system of Figure 3 as that vehicle traveled along HOV/HOT lane 304. Windshield 402 is shown divided by divider 401 into the driver's side and the passenger side of the vehicle's front passenger compartment. The driver's head (shown in region 403) has been isolated in the image using a face detection algorithm. In accordance with the teachings herein, the pixel intensity values in the driver's facial region are used to determine a pixel classification threshold which, in turn, is used to classify pixels in the remaining portion of the IR image so that the total number of human occupants in the vehicle can be determined.
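As a rough sketch, not taken from the patent, of how the driver's facial region might supply the multi-band spectra used for threshold determination, the following assumes the IR image is held as an (H, W, N) intensity cube and that a face detector has already returned a bounding box for a region such as region 403; the function and variable names are illustrative.

```python
import numpy as np

def face_region_spectra(ir_cube: np.ndarray, face_box) -> np.ndarray:
    """Collect the N-band intensity vectors of the pixels inside the
    detected driver facial region.

    ir_cube  -- multi-band IR image as an (H, W, N) array of intensities
    face_box -- (row0, col0, row1, col1) bounding box from a face detector
    Returns an (M, N) array with one row per pixel in the region.
    """
    r0, c0, r1, c1 = face_box
    region = ir_cube[r0:r1, c0:c1, :]            # (h, w, N) sub-cube
    return region.reshape(-1, region.shape[-1])  # flatten to (M, N) spectra
```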
Reference is now made to the flow diagram of Figure 5, which illustrates an example embodiment of the present method for determining a threshold for pixel classification. Flow processing starts at 500 and immediately proceeds to step 502.

In step 502, an IR image of a moving vehicle intended to be analyzed for human occupancy detection is captured using a multi-band IR imaging system comprising an IR detector and an IR illuminator. An example IR illumination system is shown and discussed in relation to Figure 1. An example IR detection system is shown and discussed in relation to Figure 2. An example vehicle occupancy detection system, used for capturing an IR image of a motor vehicle traveling on an HOV/HOT lane, is shown and discussed in relation to Figure 3. The IR image can be received from transmitter 308 of Figure 3 or from a remote device over a network. An example network-connected workstation is shown and discussed in relation to Figure 7. Captured images can be processed automatically or stored in a memory or storage device for later processing in accordance with the teachings herein. Such embodiments are intended to fall within the scope of the appended claims. An area of the front windshield can be cropped from the image for processing, such as that of Figure 4. Techniques for identifying, isolating, and cropping a portion of an image are well established in the image processing arts; therefore, a discussion of a particular method is omitted herein. The windshield area can also be cropped manually using, for example, an image manipulation tool and a graphical user interface of a computer workstation.

In step 504, the facial area of the vehicle's driver is identified in the driver's portion of the front passenger compartment. The driver's face can be identified in the image using a facial recognition software algorithm. Such algorithms provide high recognition capability, especially if a candidate region of the image where the face can be found has already been identified, for example, by dividing the vehicle's front windshield area into an area on the passenger side and an area on the driver's side and, once divided, isolating the candidate region where the driver's head can be found. The candidate region can be identified in the driver's half of the front passenger compartment using the known locations of one or more parts of the car such as, for example, the steering wheel, a rear-view or side mirror, a driver's seat headrest, and the like.

In step 506, human tissue pixels of the driver are detected in the IR image at the location identified by the facial recognition algorithm. It should be noted that the human tissue pixels need not be those of the driver's face; they could be from the driver's hand positioned on the steering wheel, or from a portion of the driver's arm that has been detected or otherwise identified in the image.

In step 508, the multi-band spectral information of the driver's human tissue pixels (from step 506) is used to determine the pixel classification threshold in a manner previously discussed.

In step 510, the pixel classification threshold is used to classify the human tissue pixels in the remainder of the image.

In step 512, the number of human occupants in the vehicle is determined based on the classified pixels.

In step 514, a determination is made whether the vehicle contains at least two human occupants.
If it is determined that the vehicle does not contain at least two human occupants, then processing continues with respect to node A of Figure 6 where, in step 516, a determination is made that a traffic violation has occurred. The determination can readily be made by comparing the total number of human occupants determined for this vehicle with the number of occupants required in the vehicle at the time the vehicle was detected on the HOV/HOT lane. In step 518, a law enforcement or traffic authority is notified that the vehicle in the image was traveling on the HOV/HOT lane, at the specified time of day, without the required number of human occupants. Law enforcement can then read the license number from that vehicle's license plate and issue a traffic citation to the registered owner of the vehicle. Thereafter, flow processing continues with respect to node C where, in step 502, another IR image of a motor vehicle intended to be processed for vehicle occupancy detection is captured or otherwise received, and processing repeats in a similar manner for the next image.

If, in step 514, it is determined that the vehicle contains at least two human occupants, then processing continues with respect to node B where, in step 520, a determination is made that no traffic violation has occurred. This particular image can be stored in a storage device for a predetermined period of time or summarily discarded. Flow processing continues with respect to node C of Figure 5 where, in step 502, another IR image of a motor vehicle is received for processing. Processing repeats in a similar fashion until there are no more images or until the system is taken out of service for maintenance or repair. In another embodiment, processing stops after a predetermined number of images have been processed.

It should be noted that the various aspects of the flow diagram embodiment of Figures 5-6 are intended for use in HOV/HOT detection systems in which a violation occurs when the motor vehicle does not contain at least two occupants. Some traffic authorities require more than two occupants to authorize travel by a particular vehicle in their respective HOV/HOT lanes during particular times of day. These methods are intended to apply to those situations as well. It should be understood that the flow diagrams described herein are illustrative. One or more of the operations illustrated in the flow diagrams can be performed in a different order. Other features can be added, modified, augmented, or consolidated. Variations thereof are intended to fall within the scope of the appended claims. All or portions of the flow diagrams can be implemented partially or completely in hardware, together with machine-executable instructions, in communication with various components of a vehicle occupancy detection system.

Reference is now made to Figure 7, which illustrates a block diagram of an example system capable of implementing various aspects of the present method shown and described in connection with the flow diagrams of Figures 5 and 6. In Figure 7, workstation 700 is shown placed in communication with receiver 702 for receiving the IR image from antenna 308 of Figure 3 and for effecting bidirectional communication between computer 700 and the IR detection system of Figure 3. Computer system 700 comprises a display 703 for presenting information to a user and for data entry, and a keyboard 705 for the user to make selections
such as, for example, identifying the windshield area of a received IR image or visually inspecting an image in the event that occupancy detection has failed. A user can use the graphical user interface to identify or select one or more portions of an IR image such as, for example, candidate area 403, where the driver's facial area can be found by a face detection algorithm, or to manually identify the driver's facial region. Pixels identified or otherwise detected in the received image can be retrieved from a remote device over network 701. Various portions of the image of the motor vehicle to be processed in accordance with the teachings herein can be stored in a memory or storage device placed in communication with workstation 700, or communicated to a remote device for further storage or processing over network 701 via a communication interface (not shown).

Computer 700 and Tx/Rx element 702 are in communication with Image Processing Unit 706, which processes the received image in accordance with the teachings herein. Image Processing Unit 706 comprises buffer 707 for queuing received images for subsequent processing. The buffer may also be configured to store data, formulas, pixel classification methods, and other variables and representations needed to facilitate processing of the received IR image in accordance with the methods described herein. Windshield Crop Module 708 receives the next IR image from buffer 707 and isolates and crops the motor vehicle's windshield area from the image for further processing (as shown in Figure 4). Face Recognition Processor 709 performs facial recognition on the image to identify the driver's head and facial region. Processor 709 may be configured to receive a candidate region of the front windshield portion of the IR image likely to contain the driver's head and facial area (such as region 403 of Figure 4) and process that region. The location of the identified driver facial region is provided to Pixel Detector Module 710, which detects human tissue pixels at those locations. The multi-band spectral information associated with the human tissue pixels is provided to storage device 711. Several embodiments herein involve cross-referencing measured pixel intensity values with intensity values calculated using known reflectances obtained from a storage device. In such embodiments, device 711 additionally contains information such as, for example, the power spectrum of the IR illuminator, the transmittance of each filter, the response curve of the IR detector, and other information required by whichever pixel classification method is employed, as discussed in relation to Equations (1)-(6). Module 710 communicates the multi-band spectral information of the human tissue pixels of the driver's facial region to Threshold Processor 712, which calculates a pixel classification threshold in accordance with the various embodiments described herein, depending on whether the correlation or ratio classification method is used. The Pixel Classification Module receives the pixel classification threshold from Processor 712 and proceeds to classify the pixels in the remainder of the IR image, or the remainder of the cropped windshield area.
Occupant Determiner 714 receives the classified pixels from the Pixel Classification Module and proceeds to determine the number of human occupants in the vehicle by clustering the human tissue pixels and counting the number of pixel clusters (or blobs) determined to be individual human occupants of the vehicle. The number of human occupants is provided to Violation Processor 715, which proceeds to determine whether a traffic violation has occurred, given the determined number of occupants and the time of day the image was captured. If a traffic violation has occurred, Processor 715 initiates a signal via Tx/Rx element 716 to the law enforcement or traffic authority indicating that a vehicle has been detected traveling on an HOV/HOT lane without the required number of human occupants. The signal can include the original image and one or more aspects of the processed image. Law enforcement can then act accordingly.

It should be noted that each of the modules and/or processors of Figure 7 is in communication with workstation 700 and storage device 711 via communication paths (shown and not shown) and can store and retrieve data, parameter values, functions, pages, registers, and machine-readable/executable program instructions necessary to perform its various functions. Each can additionally be in communication with one or more remote devices over network 701. Connections between modules and processing units are intended to include both physical and logical connections. It should also be noted that some or all of the functionality of any of the modules or processing units of Figure 7 can be performed, in whole or in part, by internal components of workstation 700 or by a special purpose computer system.
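To make the overall correlation-method pipeline concrete, here is a minimal end-to-end sketch, illustrative rather than taken from the patent: it derives the threshold from the driver-face correlations as a mean minus a β-scaled standard deviation, classifies the remaining pixels against that threshold, and counts connected clusters of tissue pixels as candidate occupants. The function names, the default β, and the use of SciPy's connected-component labeling are all assumptions.

```python
import numpy as np
from scipy import ndimage

def count_occupants(ir_cube: np.ndarray, face_spectra: np.ndarray,
                    beta: float = 1.0) -> int:
    """Correlation-method occupant counting sketch.

    ir_cube      -- (H, W, N) multi-band IR image of the windshield area
    face_spectra -- (M, N) intensities of detected driver facial tissue pixels
    beta         -- constant balancing tissue pixels kept vs. false positives
    """
    ref = face_spectra.mean(axis=0)
    ref = ref / np.linalg.norm(ref)                    # unit reference spectrum

    h, w, n = ir_cube.shape
    pixels = ir_cube.reshape(-1, n).astype(float)
    norms = np.linalg.norm(pixels, axis=1)
    norms[norms == 0.0] = 1.0
    corr_img = (pixels @ ref / norms).reshape(h, w)    # per-pixel correlation

    # Threshold from the driver-face correlations: mean minus beta * std dev.
    face_norms = np.linalg.norm(face_spectra, axis=1, keepdims=True)
    face_corr = (face_spectra / face_norms) @ ref
    c_th = face_corr.mean() - beta * face_corr.std()

    tissue_mask = corr_img >= c_th                     # human-tissue pixels
    _, num_clusters = ndimage.label(tissue_mask)       # clusters ~ occupants
    return int(num_clusters)
```

A deployment would likely filter out very small clusters before treating each remaining cluster as a candidate occupant; the ratio method could be thresholded in the same mean-minus-spread fashion.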
Claims:
Claims (20)

1. A method for determining a threshold for pixel classification in a vehicle occupancy detection system, characterized in that it comprises the steps of: receiving an IR image of a moving vehicle intended to be analyzed for passenger occupancy of the vehicle, the image having been captured using a multi-band IR imaging system comprising an IR detector and an IR illuminator, the imaging system having collected intensity values for each pixel in the image; detecting human tissue pixels of the vehicle's driver in the IR image; and using the multi-band spectral information of the detected human tissue pixels to determine a pixel classification threshold Cth comprising:

2. The method according to claim 1, characterized in that it further comprises using the pixel classification threshold to classify human tissue pixels in a remaining portion of the image, the pixels being classified according to a correlation method comprising:

3. The method according to claim 1, characterized in that it further comprises: cropping an area of the vehicle's windshield from the image; and analyzing the windshield area to determine an approximate location of the driver's side and of the passenger's side of the vehicle's front passenger compartment.

4. The method according to claim 1, characterized in that it further comprises determining a number of human occupants in the vehicle based on the pixel classifications.

5. The method according to claim 4, characterized in that it further comprises alerting an authority in the instance where a front passenger seat is determined not to be occupied by a human occupant and the vehicle is traveling on an HOV/HOT lane.

6. A system for determining a threshold for pixel classification in a vehicle occupancy detection system, characterized in that it comprises: a multi-band IR image sensor for capturing an IR image of a moving vehicle intended to be analyzed for passenger occupancy of the vehicle, the captured IR image comprising collected intensity values for each pixel in the image; and a processor in communication with the image sensor and a memory, the processor executing machine-readable instructions for carrying out the steps of: detecting human tissue pixels of the vehicle's driver in the IR image; and using the multi-band spectral information of the detected human tissue pixels to determine a pixel classification threshold Cth comprising:

7. The system according to claim 6, characterized in that it further comprises using the pixel classification threshold to classify human tissue pixels in a remaining portion of the image, the pixels being classified according to a correlation method comprising:

8. The system according to claim 6, characterized in that it further comprises: cropping an area of the vehicle's windshield from the image; and analyzing the windshield area to determine an approximate location of the driver's side and of the passenger's side of the vehicle's front passenger compartment.

9. The system according to claim 6, characterized in that it further comprises determining a number of human occupants in the vehicle based on the pixel classifications.

10. The system according to claim 6, characterized in that it further comprises alerting an authority in the instance where a front passenger seat is determined not to be occupied by a human occupant and the vehicle is traveling on an HOV/HOT lane.
11. A method for determining a threshold for pixel classification in a vehicle occupancy detection system, characterized in that it comprises the steps of: receiving an IR image of a moving vehicle intended to be analyzed for passenger occupancy of the vehicle, the image having been captured using a multi-band IR imaging system comprising an IR detector and an IR illuminator, the imaging system having collected intensity values for each pixel in the image; detecting human tissue pixels of the vehicle's driver in the IR image; and using the multi-band spectral information of the detected human tissue pixels to determine a pixel classification threshold Sth comprising:

12. The method according to claim 11, characterized in that it further comprises using the pixel classification threshold to classify human tissue pixels in a remaining portion of the image, the pixels being classified according to a ratio method comprising:

13. The method according to claim 11, characterized in that it further comprises: cropping an area of the vehicle's windshield from the image; and analyzing the windshield area to determine an approximate location of the driver's side and of the passenger's side of the vehicle's front passenger compartment.

14. The method according to claim 11, characterized in that it further comprises determining a number of human occupants in the vehicle based on the pixel classifications.

15. The method according to claim 14, characterized in that it further comprises alerting an authority in the instance where a front passenger seat is determined not to be occupied by a human occupant and the vehicle is traveling on an HOV/HOT lane.

16. A system for determining a threshold for pixel classification in a vehicle occupancy detection system, characterized in that it comprises: a multi-band IR image sensor for capturing an IR image of a moving vehicle intended to be analyzed for passenger occupancy of the vehicle, the captured IR image comprising collected intensity values for each pixel in the image; and a processor in communication with the image sensor and a memory, the processor executing machine-readable instructions for carrying out the steps of: detecting human tissue pixels of the vehicle's driver in the IR image; and using the multi-band spectral information of the detected human tissue pixels to determine a pixel classification threshold Sth comprising:

17. The system according to claim 16, characterized in that it further comprises using the pixel classification threshold to classify human tissue pixels in a remaining portion of the image, the pixels being classified according to a ratio method comprising:

18. The system according to claim 16, characterized in that it further comprises: cropping an area of the vehicle's windshield from the image; and analyzing the windshield area to determine an approximate location of the driver's side and of the passenger's side of the vehicle's front passenger compartment.

19. The system according to claim 16, characterized in that it further comprises determining a number of human occupants in the vehicle based on the pixel classifications.

20. The system according to claim 16, characterized in that it further comprises alerting an authority in the instance where a front passenger seat is determined not to be occupied by a human occupant and the vehicle is traveling on an HOV/HOT lane.
Family patents:
Publication number | Publication date
DE102012221648A1 | 2013-06-13
CN103164708A | 2013-06-19
BR102012031758A2 | 2015-04-14
US20130147959A1 | 2013-06-13
CN103164708B | 2017-12-05
KR20130067235A | 2013-06-21
US9202118B2 | 2015-12-01
KR101924649B1 | 2018-12-03
MX2012014267A | 2013-06-17
Legal status:
2015-04-14 | B03A | Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]
2018-12-04 | B06F | Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]
2019-10-15 | B06U | Preliminary requirement: requests with searches performed by other patent offices; procedure suspended [chapter 6.21 patent gazette]
2021-01-05 | B08F | Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette]; free format text: relating to the 8th annuity
2021-04-13 | B08G | Application fees: restoration [chapter 8.7 patent gazette]
2021-05-04 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2021-06-01 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]; free format text: term of validity of 20 (twenty) years counted from 12/13/2012, subject to the legal conditions
Priority:
Application number | Filing date | Patent title
US13/324,308 | 2011-12-13 |
US13/324,308 (US9202118B2) | 2011-12-13 | Determining a pixel classification threshold for vehicle occupancy detection